Maximising Entropy Efficiently
Author
Abstract
Determining a prior probability function via the maximum entropy principle can be a computationally intractable task. However, one can easily determine — in advance of entropy maximisation — a list of probabilistic independencies that the maximum entropy function will satisfy. These independencies can be used to reduce the complexity of the entropy maximisation task. In particular, one can use these independencies to construct the directed acyclic graph of a Bayesian network, and then maximise entropy with respect to the numerical parameters of this network. This can result in an efficient representation of a prior probability function, and one that may allow efficient updating and marginalisation. The computational complexity of maximising entropy can be further reduced when knowledge of causal relationships is available. Moreover, the proposed simplification of the entropy maximisation task may be exploited to construct a proof theory for probabilistic logic.
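The core idea above — that known independencies let the maximum entropy function factorise, so one can optimise over a few network parameters instead of a full joint — can be sketched numerically. The following is a hedged illustration, not code from the paper: it maximises entropy over the joint of two binary variables subject only to marginal constraints (hypothetical values P(A=1)=0.7, P(B=1)=0.4), and checks that the resulting joint is the product of the marginals, i.e. exactly what a Bayesian network with no edge between A and B would parameterise directly.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical marginal constraints (illustrative values, not from the paper).
pa, pb = 0.7, 0.4

# Joint over (A, B) flattened as p = [p00, p01, p10, p11].
def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)          # avoid log(0)
    return np.sum(p * np.log(p))        # minimising this maximises entropy

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},      # normalisation
    {"type": "eq", "fun": lambda p: p[2] + p[3] - pa},   # P(A=1) = pa
    {"type": "eq", "fun": lambda p: p[1] + p[3] - pb},   # P(B=1) = pb
]

res = minimize(neg_entropy, x0=np.full(4, 0.25),
               bounds=[(0.0, 1.0)] * 4, constraints=constraints)

joint = res.x.reshape(2, 2)                              # rows: A, cols: B
product = np.outer([1 - pa, pa], [1 - pb, pb])           # P(a) * P(b)

# The maximum entropy joint satisfies the independence A ⫫ B,
# so it coincides with the factorised (Bayesian network) form.
print(np.allclose(joint, product, atol=1e-4))
```

The point of the check at the end is that the independence could have been imposed up front: instead of optimising four joint probabilities under three constraints, one could optimise the two node parameters of the factorised network — the complexity reduction the abstract describes, in miniature.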
Similar papers
Relative and Discrete Utility Maximising Entropy
The notion of utility maximising entropy (u-entropy) of a probability density, which was introduced and studied in [SZ04], is extended in two directions. First, the relative u-entropy of two probability measures in arbitrary probability spaces is defined. Then, specialising to discrete probability spaces, we also introduce the absolute u-entropy of a probability measure. Both notions are based ...
A dual neural network for solving entropy-maximising models
The entropy-maximising model has been applied with varying degrees of success in the analysis and planning of origin-destination types of spatial interaction. Although theoretical underpinnings and solution methods have been developed over the years, there are still outstanding problems that need to be thoroughly investigated. From the practical point of view, solving this model directly and ...
Entropy of Hyperbolic Buildings
We characterize the volume entropy of a regular building as the topological pressure of the geodesic flow on an apartment. We show that the entropy maximizing measure is not Liouville measure for any regular hyperbolic building. As a consequence, we obtain a strict lower bound on the volume entropy in terms of the branching numbers and the volume of the boundary polyhedra.
Generalised information and entropy measures in physics
The formalism of statistical mechanics can be generalised by starting from more general measures of information than the Shannon entropy and maximising those subject to suitable constraints. We discuss some of the most important examples of information measures that are useful for the description of complex systems. Examples treated are the Rényi entropy, Tsallis entropy, Abe entropy, Kaniadaki...
Some results concerning maximum Rényi entropy distributions
We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising family, and show that the Rényi maximi...